Abstract:
Data stores and cloud services are typically accessed using a client-server paradigm in which the client runs as part of an application process that is trying to access the data store or cloud service. This paper presents the design and implementation of enhanced clients for improving both the functionality and performance of applications accessing data stores or cloud services. Our enhanced clients can improve performance via multiple types of caches, encrypt data to provide confidentiality before sending information to a server, and compress data to reduce the size of data transfers. Our clients can perform data analysis to allow applications to use cloud services more effectively. They also provide both synchronous and asynchronous interfaces. An asynchronous interface allows an application program to access a data store or cloud service and continue execution before receiving a response, which can significantly improve performance. We present a Universal Data Store Manager (UDSM) which allows an application to access multiple different data stores and provides a common interface to each data store. The UDSM can also monitor the performance of different data stores. A workload generator allows users to easily determine and compare the performance of different data stores. We also present NLU-SA, an application for performing natural language understanding and sentiment analysis on text documents. NLU-SA is implemented on top of our enhanced clients and integrates text analysis with Web searching. We present results from NLU-SA on sentiment on the Web towards major companies and countries. We also present a performance analysis of our enhanced clients.
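A minimal sketch of what the synchronous and asynchronous client interfaces described above could look like; the interface and method names below are illustrative Java assumptions, not the actual UDSM API.

    // Hypothetical data store client exposing both calling styles; names are
    // assumed for illustration only.
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    interface DataStoreClient {
        String get(String key);                          // synchronous: blocks until the store replies
        CompletableFuture<String> getAsync(String key);  // asynchronous: returns immediately
    }

    class SimpleClient implements DataStoreClient {
        private final ExecutorService pool = Executors.newFixedThreadPool(4);

        @Override
        public String get(String key) {
            return fetchFromStore(key);
        }

        @Override
        public CompletableFuture<String> getAsync(String key) {
            // The caller keeps executing and collects the result later, which is
            // where the performance benefit of the asynchronous interface comes from.
            return CompletableFuture.supplyAsync(() -> fetchFromStore(key), pool);
        }

        private String fetchFromStore(String key) {
            // Placeholder for the actual network round trip to the data store.
            return "value-for-" + key;
        }
    }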
Abstract:
The Internet is a medium that connects millions of computers sharing and accessing information all over the world. With the evolution of the web and its increased use in every aspect of life, the need for web security has become imperative. As websites opt for commercial viability, the threat of hackers, viruses, or annoyance attacks becomes more pronounced. Organizations face several security-related challenges. If organizational information is hacked, either through the network or through other means, it could incur a heavy cost to the company. A failure in network security could also cost the organization in terms of its goodwill and reputation. This paper identifies common threats on the web and classifies them into categories such as accidental, malicious, authorization, application, privacy, and access control threats. It also highlights the three main areas in which the web can be secured, i.e., the client side, the server side, and the network side. The paper discusses the primary goals and objectives of security contained within the CIA triad: Confidentiality, Integrity, and Availability. The different types of attackers that threaten web security are also described. The paper then presents attacks related to client-side, server-side, and network-side threats. Client-side security threats are classified into Cross-Site Scripting, Cross-Site Request Forgery, Broken Authentication and Session Management, Security Misconfiguration, and Failure to Restrict URL Access. Server-side threats include Structured Query Language (SQL) Injection, Malicious File Execution, Insecure Direct Object Reference, Insecure Cryptographic Storage, and Unvalidated Redirects and Forwards. The network threats highlighted are Denial of Service (DoS), Insufficient Transport Layer Protection, Eavesdropping, Data Modification, IP Address Spoofing, Sniffer attacks, Man-in-the-Middle attacks, Phishing, Brute-force attacks, and TCP Session Hijacking. The paper explains the causes of each attack and highlights the previously defined web application metrics. A metric named the Web Application Security Metric (WASM) is proposed to help secure web pages. WASM is calculated as the sum of the weights of categories such as input validation, authentication, authorization, configuration management, sensitive data, session management, cryptography, parameter manipulation, exception management, and auditing and logging.
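As a rough illustration of how a weighted-category score of this kind could be computed, the Java sketch below sums per-category weights; the weights shown are invented placeholders, not values from the paper.

    // Toy WASM-style score: sum the weights assigned to security categories.
    import java.util.Map;

    public class WasmScore {
        public static double compute(Map<String, Double> categoryWeights) {
            return categoryWeights.values().stream()
                                  .mapToDouble(Double::doubleValue)
                                  .sum();
        }

        public static void main(String[] args) {
            Map<String, Double> weights = Map.of(
                "Input validation", 0.9,      // placeholder weight
                "Authentication", 0.8,        // placeholder weight
                "Authorization", 0.7,         // placeholder weight
                "Session management", 0.6     // placeholder weight
                // remaining categories from the paper would be added here
            );
            System.out.printf("WASM = %.2f%n", compute(weights));
        }
    }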
Abstract:
Labour productivity growth is a necessary condition for social and economic progress in general, and for overcoming the economic crisis facing most of the world in particular. Applying innovative ICT-based solutions is one of the most direct ways of achieving this important and necessary objective. This paper presents a software solution applicable to industrial production based on numerically controlled machines. It involves a distributed client-server communication system combined with MLP neural networks for recognizing 2D industrial objects viewed from any angle. The information on the prismatic and rotational parts to be processed by the numerically controlled machines is stored on a database server together with the corresponding processing programs. The client applications run on the numerically controlled machines and on the robots serving groups of machines. While the machines are fixed, the robots are mobile and can move from one machine to another. As a novelty of the proposed solution, in some well-defined situations the clients are allowed to exchange messages among themselves in order to avoid overloading the server. The neural networks help the robots recognize the parts before and during manipulation.
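A minimal sketch, under assumed class and method names, of the client-side fallback described above, in which a client asks a peer for part data instead of the overloaded server; this is not the paper's actual message protocol.

    // Client running on a numerically controlled machine; when the database
    // server reports overload, it asks a peer (e.g. a robot client) first.
    import java.util.Optional;

    class MachineClient {
        private final ServerConnection server;
        private final PeerConnection peer;

        MachineClient(ServerConnection server, PeerConnection peer) {
            this.server = server;
            this.peer = peer;
        }

        PartProgram fetchProgram(String partId) {
            if (server.isOverloaded()) {
                // Well-defined fallback case: exchange messages with another
                // client instead of adding load on the server.
                Optional<PartProgram> fromPeer = peer.requestProgram(partId);
                if (fromPeer.isPresent()) {
                    return fromPeer.get();
                }
            }
            return server.loadProgram(partId);
        }
    }

    interface ServerConnection {
        boolean isOverloaded();
        PartProgram loadProgram(String partId);
    }

    interface PeerConnection {
        Optional<PartProgram> requestProgram(String partId);
    }

    record PartProgram(String partId, byte[] ncCode) {}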
Abstract:
Predicting workload characteristics could help web systems achieve elastic scaling and reliability by optimizing server configuration and ensuring Quality of Service, for example by increasing or decreasing the resources in use. However, successfully analyzing the system with a simulation model and recognizing and predicting client behavior is a challenging task. Furthermore, network traffic characteristics change frequently in modern web systems, and the huge volume of system logs makes them a difficult area for data mining research. In this work, we investigate prepared trace contents obtained from a benchmark of the web system. The article proposes a traffic classification for the web system that is used to identify the behavior of client classes. We present a case study involving workload analysis of an online stock trading application that runs in the cloud and processes requests from the designed generator. The results show that the proposed analysis could help us better understand request scenarios and select the values of system and application parameters. Our work is useful to practitioners and researchers in log analysis who aim to enhance service reliability.
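The sketch below shows one plausible way to group benchmark trace lines into per-client request sequences and assign coarse client classes; the log format and class labels are assumptions, not the classification actually used in the article.

    // Groups assumed trace lines of the form "clientId,timestamp,requestType"
    // by client and labels each client with a toy behavior class.
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class TraceClassifier {
        public static Map<String, String> classify(List<String> traceLines) {
            Map<String, List<String>> requestsByClient = traceLines.stream()
                .map(line -> line.split(","))
                .collect(Collectors.groupingBy(fields -> fields[0],
                         Collectors.mapping(fields -> fields[2], Collectors.toList())));

            Map<String, String> classByClient = new HashMap<>();
            requestsByClient.forEach((client, requests) -> {
                long trades = requests.stream().filter("trade"::equals).count();
                // Toy rule: clients dominated by trade requests versus browsing ones.
                classByClient.put(client,
                    trades > requests.size() / 2 ? "trading-heavy" : "browsing");
            });
            return classByClient;
        }
    }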
Abstract:
One of the key areas of Web geotechnology development is the implementation of software tools and systems that are capable not only of displaying geospatial data in the Web interface but also of providing functionality for processing and analysis directly in the browser window. A significant feature of current Web-based geospatial standards is their focus on server-side data processing. Our study investigates the possibilities and general ways of implementing decentralized data processing on the client side using the Java Web Start technology. Test software tools are developed that implement the capability of transmitting executable program code to the client computer through the Web interface and processing spatial data on the client side.
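As a hedged illustration, the following Java entry point is the kind of class that could be delivered to the client through a Java Web Start (JNLP) launch and then process spatial data entirely in the client JVM; the data URL and the centroid computation are invented for the example.

    // Downloads assumed CSV point data ("x,y" per line) and computes a centroid
    // on the client side; in a real deployment this class would be packaged in a
    // signed JAR referenced by a JNLP descriptor.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class ClientSideGeoProcessor {
        public static void main(String[] args) throws Exception {
            URL data = new URL("https://example.org/geodata/points.csv"); // hypothetical endpoint
            double sumX = 0, sumY = 0;
            int n = 0;
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(data.openStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] p = line.split(",");
                    sumX += Double.parseDouble(p[0]);
                    sumY += Double.parseDouble(p[1]);
                    n++;
                }
            }
            // All computation happens in the client JVM, not on the server.
            System.out.printf("Centroid of %d points: (%.4f, %.4f)%n",
                              n, sumX / n, sumY / n);
        }
    }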
Abstract:
As businesses continue to grow their World Wide Web presence, it is becoming increasingly vital for them to have quantitative measures of the mean client perceived response times of their web services. We present Certes (CliEnt Response Time Estimated by the Server), an online server-based mechanism that allows web servers to estimate mean client perceived response time, as if measured at the client. Certes is based on a model of TCP that quantifies the effect that connection drops have on mean client perceived response time by using three simple server-side measurements: connection drop rate, connection accept rate and connection completion rate. The mechanism does not require modifications to HTTP servers or web pages, does not rely on probing or third party sampling, and does not require client-side modifications or scripting. Certes can be used to estimate response times for any web content, not just HTML. We have implemented Certes and compared its response time estimates with those obtained with detailed client instrumentation. Our results demonstrate that Certes provides accurate server-based estimates of mean client response times in HTTP 1.0/1.1 environments, even with rapidly changing workloads. Certes runs online in constant time with very low overhead. It can be used at websites and server farms to verify compliance with service level objectives.
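The sketch below illustrates how the three server-side measurements named above (connection drop, accept, and completion rates) could be collected; how Certes combines them into a response-time estimate is not reproduced here.

    // Thread-safe counters a server could keep per observation window; the
    // resulting rates are then fed into the response-time model as inputs.
    import java.util.concurrent.atomic.LongAdder;

    public class ConnectionCounters {
        private final LongAdder drops = new LongAdder();
        private final LongAdder accepts = new LongAdder();
        private final LongAdder completions = new LongAdder();

        public void onDrop()       { drops.increment(); }
        public void onAccept()     { accepts.increment(); }
        public void onCompletion() { completions.increment(); }

        // Rates over an observation window of windowSeconds, resetting the counters.
        public double dropRate(double windowSeconds)       { return drops.sumThenReset() / windowSeconds; }
        public double acceptRate(double windowSeconds)     { return accepts.sumThenReset() / windowSeconds; }
        public double completionRate(double windowSeconds) { return completions.sumThenReset() / windowSeconds; }
    }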
Abstract:
Advances in computing and communication technologies have resulted in a wide variety of networked mobile devices that access data over the Internet. In this paper, we argue that servers by themselves may not be able to handle this diversity in client characteristics, and so intermediaries, such as proxies, should be employed to handle the mismatch between the server-supplied data and the client capabilities. Since existing proxies are primarily designed to handle traditional wired hosts, such proxy architectures will need to be enhanced to handle mobile devices. We propose such an enhanced proxy architecture that is capable of handling the heterogeneity in client needs, specifically the variations in client bandwidth and display capabilities. Our architecture combines transcoding (which is used to match the fidelity of the requested object to client capabilities) and caching (which is used to reduce the latency for accessing popular objects). Proxies that Transcode and Cache (PTCs) intelligently adapt to prevailing system conditions using learning techniques to decide whether to transcode locally or fetch an appropriate version from the server. Our experimental results indicate that the use of PTCs produces significant improvements in client response times. We show that such results hold true for a variety of data content types, such as images and video. Further, we find that even simple learning techniques can lead to significant performance improvements.
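A simplified stand-in for the transcode-or-fetch decision described above: keep running cost estimates for transcoding locally versus fetching a pre-transcoded version from the server, and pick the cheaper option; the exponential-average update rule is an assumption, not the paper's learning technique.

    // Adapts to prevailing conditions by updating cost estimates from observed
    // timings and choosing the currently cheaper action.
    public class TranscodeOrFetch {
        private double estLocalMs = 100.0;        // estimated local transcoding time (ms)
        private double estFetchMs = 100.0;        // estimated server fetch time (ms)
        private static final double ALPHA = 0.2;  // smoothing factor (assumed)

        public boolean shouldTranscodeLocally() {
            return estLocalMs <= estFetchMs;
        }

        // Feed back observed costs so future decisions adapt to system conditions.
        public void recordLocalCost(double observedMs) {
            estLocalMs = ALPHA * observedMs + (1 - ALPHA) * estLocalMs;
        }

        public void recordFetchCost(double observedMs) {
            estFetchMs = ALPHA * observedMs + (1 - ALPHA) * estFetchMs;
        }
    }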
Abstract:
An e-mail listserv was used to recruit participants, administer a survey, send electronic gift certificates, and disseminate the findings of a study of psychological responses to being a carrier of the gene for phenylketonuria, a rare genetic disease. The majority of responses to the call for participants were received within the first 24 hours, and most surveys (n = 83) were returned via e-mail within 5 days. The use of e-mail allowed more opportunities for researcher-participant interaction than Web-based surveys, but the return rate of 51% may reflect concerns about privacy when e-mail addresses are required.